Markov’s process

Markov process; a probabilistic model proposed by A. A. Markov for the study of real-world systems.

English-Russian Dictionary of Sociology. 2011.


See what "Markov’s process" means in other dictionaries:

  • Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… …   Wikipedia

  • Markov additive process — In applied probability, a Markov additive process (MAP) {(X(t),J(t)) : t ≥ 0} is a bivariate Markov process whose transition probability measure is translation invariant in the additive component X(t). That is to say, the… …   Wikipedia

  • Markov jump process — noun A time-dependent variable that starts in an initial state and stays in that state for a random time, until it makes a transition to another random state, and so on … Wiktionary

  • Partially observable Markov decision process — A Partially Observable Markov Decision Process (POMDP) is a generalization of a Markov Decision Process. A POMDP models an agent decision process in which it is assumed that the system dynamics are determined by an MDP, but the agent cannot… …   Wikipedia

  • Markov chain — A simple two state Markov chain. A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized …   Wikipedia

  • Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… … Wikipedia

  • Markov property — In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It was named after the Russian mathematician Andrey Markov.[1] A stochastic process has the Markov property if the… …   Wikipedia

  • Markov model — In probability theory, a Markov model is a stochastic model that assumes the Markov property. Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable. Contents 1 Introduction 2 Markov chain… …   Wikipedia

  • Markov process — Markov process, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next… … The Collaborative International Dictionary of English

  • Markov chain — Markov chain, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding… … The Collaborative International Dictionary of English

  • Markov — Markov, Markova, or Markoff are surnames and may refer to: In academia: Ivana Markova (born 1938), Czechoslovak British emeritus professor of psychology at the University of Stirling John Markoff (sociologist) (born 1942), American professor of… …   Wikipedia
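All of the entries above rest on the same mechanism: the next state depends only on the current state (the Markov property). A minimal sketch in Python of a two-state Markov chain; the state names and transition probabilities are illustrative assumptions, not taken from any of the dictionaries cited:

```python
import random

# Illustrative two-state chain; these probabilities are assumptions for the example.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state: it depends only on the current state (Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Generate a path of n transitions from the given start state."""
    random.seed(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because each call to `step` looks only at the current state, the chain is "memoryless" in exactly the sense the Collaborative International Dictionary entries describe.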

